From Label Smoothing to Label Relaxation


Abstract

Regularization of (deep) learning models can be realized at the model, loss, or data level. As a technique somewhere in between loss and data, label smoothing turns deterministic class labels into probability distributions, for example by uniformly distributing a certain part of the probability mass over all classes. A predictive model is then trained on these distributions as targets, using the cross-entropy loss function. While this method has shown improved performance compared to non-smoothed cross-entropy, we argue that the use of a smoothed though still precise probability distribution as a target can be questioned from a theoretical perspective. As an alternative, we propose a generalized technique called label relaxation, in which the target is a set of probabilities represented in terms of an upper probability distribution. This leads to a genuine relaxation of the target instead of a distortion, thereby reducing the risk of incorporating an undesirable bias in the learning process. Methodically, label relaxation leads to the minimization of a novel type of loss function, for which a suitable closed-form expression is derived for model optimization. The effectiveness of the approach is demonstrated in an empirical study on image data.
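To make the two target constructions concrete, here is a minimal numpy sketch. The `smoothed_targets` function is standard label smoothing; `label_relaxation_loss` is one reading of the relaxation idea described in the abstract (zero loss whenever the prediction already lies in the relaxed target set, otherwise a KL divergence to a projection onto that set's boundary). The function names, the projection rule, and the choice of alpha = 0.1 are illustrative assumptions, not taken verbatim from the paper.

```python
import numpy as np

def smoothed_targets(y: int, num_classes: int, alpha: float = 0.1) -> np.ndarray:
    """Classical label smoothing: redistribute a fraction alpha of the
    probability mass from the one-hot label uniformly over all classes."""
    t = np.full(num_classes, alpha / num_classes)
    t[y] += 1.0 - alpha
    return t

def label_relaxation_loss(y_hat: np.ndarray, y: int,
                          alpha: float = 0.1, eps: float = 1e-12) -> float:
    """Sketch of a label-relaxation-style loss (an assumption, not the
    authors' reference implementation).

    The relaxed target is taken to be the set of all distributions that
    assign the true class at least 1 - alpha. If the prediction already
    lies in that set, the loss is zero; otherwise the prediction is
    penalized by the KL divergence of a boundary projection from it."""
    if y_hat[y] >= 1.0 - alpha:
        return 0.0
    # Project onto the boundary: the true class receives 1 - alpha, and the
    # remaining mass alpha is spread over the other classes in proportion
    # to the predicted probabilities.
    target = alpha * y_hat / max(1.0 - y_hat[y], eps)
    target[y] = 1.0 - alpha
    return float(np.sum(target * (np.log(target + eps) - np.log(y_hat + eps))))

# Example with three classes, true class 0:
y_hat = np.array([0.70, 0.20, 0.10])           # softmax output of some model
print(smoothed_targets(0, 3))                   # [0.9333 0.0333 0.0333]
print(label_relaxation_loss(y_hat, 0))          # > 0, since 0.70 < 0.90
print(label_relaxation_loss(np.array([0.95, 0.03, 0.02]), 0))  # 0.0
```

The key difference from smoothing is visible in the last two calls: the relaxed loss vanishes on an entire set of admissible predictions rather than pulling the model toward one precise smoothed distribution, which is the "relaxation instead of distortion" argument made in the abstract.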


Related Articles

To Re(label), or Not To Re(label)

One of the most popular uses of crowdsourcing is to provide training data for supervised machine learning algorithms. Since human annotators often make errors, requesters commonly ask multiple workers to label each example. But is this strategy always the most cost effective use of crowdsourced workers? We argue “No” — often classifiers can achieve higher accuracies when trained with noisy “uni...


Learning from Label Preferences

In this paper, we review the framework of learning (from) label preferences, a particular instance of preference learning. Following an introduction to the learning setting, we particularly focus on our own work, which addresses this problem via the learning by pairwise comparison paradigm. From a machine learning point of view, learning by pairwise comparison is especially appealing as it deco...


Convex Relaxation for Multilabel Problems with Product Label Spaces

Convex relaxations for continuous multilabel problems have attracted a lot of interest recently [1–5]. Unfortunately, in previous methods, the runtime and memory requirements scale linearly in the total number of labels, making them very inefficient and often inapplicable for problems with higher-dimensional label spaces. In this paper, we propose a reduction technique for the case that the lab...


Learning from Noisy Label Distributions

In this paper, we consider a novel machine learning problem, that is, learning a classifier from noisy label distributions. In this problem, each instance, represented by a feature vector, belongs to at least one group. Then, instead of the true label of each instance, we observe the label distribution of the instances associated with a group, where the label distribution is distorted by an unknown noise. ...


Lagrangean Relaxation Bounds for Point-feature Cartographic Label Placement Problem

The objective of the point-feature cartographic label placement problem (PFCLP) is to improve the legibility of automatically created maps by placing point labels in clear positions. Many researchers have considered distinct approaches to PFCLP, such as obtaining the maximum number of labeled points that can be placed without overlapping, or obtaining the maximum number of labeled points without overlaps c...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i10.17041